Talk by Leon Gatys of the University of Tübingen. Given at the Redwood Center for Theoretical Neuroscience at UC Berkeley.
Abstract

We introduce a new model of natural textures based on the feature spaces of convolutional neural networks optimised for object recognition. Samples from the model are of high perceptual quality, demonstrating the generative power of neural networks trained in a purely discriminative fashion. Within the model, textures are represented by the correlations between feature maps in several layers of the network. We show that across layers the texture representations increasingly capture the statistical properties of natural images while making object information more and more explicit. Extending this framework to texture transfer, we introduce A Neural Algorithm of Artistic Style that can separate and recombine the image content and style of natural images. The algorithm allows us to produce new artistic imagery that combines the content of an arbitrary photograph with the appearance of numerous well-known artworks, thus offering a path towards an algorithmic understanding of how humans create and perceive artistic imagery.
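The "correlations between feature maps" the abstract refers to are commonly computed as a Gram matrix over the spatial dimensions of a convolutional layer's activations. A minimal sketch, assuming a feature-map array of shape (C, H, W) where C is the number of channels (the function name `gram_matrix` and the normalisation choice are illustrative, not taken from the talk):

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Channel-by-channel correlations of a (C, H, W) feature map.

    Flattens the spatial dimensions, then takes inner products
    between every pair of channels, normalised by spatial size.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)      # one row per feature map
    return (f @ f.T) / (h * w)          # (C, C) symmetric matrix

# Example: a random 3-channel, 4x4 feature map.
fm = np.random.default_rng(0).normal(size=(3, 4, 4))
g = gram_matrix(fm)                     # g has shape (3, 3)
```

Because the Gram matrix discards spatial arrangement and keeps only which features co-occur, matching it across several layers captures texture ("style") independently of image layout ("content").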